# Domain-Specific Vocabulary
- **SecBERT** (jackaduma, Apache-2.0): a pretrained language model trained on cybersecurity texts, optimized for tasks in the cybersecurity domain. Tags: Large Language Model, Transformers, English. Downloads: 40.03k, likes: 52.
- **Custom Legal-BERT** (casehold): a BERT model optimized for the legal domain, pretrained from scratch on 37 GB of legal ruling texts. Tags: Large Language Model, English. Downloads: 12.59k, likes: 12.
- **SecRoBERTa** (jackaduma, Apache-2.0): a pretrained language model trained on cybersecurity texts, optimized for tasks in the cybersecurity domain. Tags: Large Language Model, Transformers, English. Downloads: 16.75k, likes: 18.
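
As a minimal sketch: assuming these checkpoints are published on the Hugging Face Hub under the ids `jackaduma/SecBERT`, `jackaduma/SecRoBERTa`, and `casehold/custom-legalbert`, they can be loaded with the `transformers` library like any other BERT-style masked language model. The snippet below probes the assumed SecBERT id with a fill-mask pipeline; swap in either of the other ids to try the legal or RoBERTa variants.

```python
# Minimal sketch: loading a domain-specific checkpoint with Hugging Face
# transformers. The Hub id "jackaduma/SecBERT" is an assumption here;
# substitute "jackaduma/SecRoBERTa" or "casehold/custom-legalbert" as needed.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "jackaduma/SecBERT"  # assumed Hub id for the SecBERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask is a natural probe for a masked language model: domain-specific
# vocabulary should surface in the top predictions for in-domain sentences.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
sentence = f"The attacker used a {tokenizer.mask_token} injection to exfiltrate data."
for candidate in fill_mask(sentence):
    print(candidate["token_str"], round(candidate["score"], 3))
```

A domain-adapted model like this is typically used either as a fill-mask probe, as above, or as a backbone for fine-tuning on downstream classification or NER tasks in the same domain.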